Minimax Nonparametric Classification - Part I: Rates of Convergence
Author
Abstract
This paper studies minimax aspects of nonparametric classification. We first study minimax estimation of the conditional probability of a class label, given the feature variable. This function, say f, is assumed to be in a general nonparametric class. We show that the minimax rate of convergence under squared L2 loss is determined by the massiveness of the class as measured by metric entropy. The second part of the paper studies minimax classification. The loss of interest is the difference between the probability of misclassification of a classifier and that of the Bayes decision. As is well known, an upper bound on the risk for estimating f gives an upper bound on the risk for classification, but the rate is known to be suboptimal for the class of monotone functions. This suggests that one does not have to estimate f well in order to classify well. However, we show that the two problems are in fact of the same difficulty in terms of rates of convergence under a sufficient condition, which is satisfied by many function classes including Besov (Sobolev), Lipschitz, and bounded variation. This is somewhat surprising in view of a result of Devroye, Györfi, and Lugosi (1996).

Index Terms: Conditional probability estimation, mean error probability regret, metric entropy, minimax rates of convergence, nonparametric classification, neural network classes, sparse approximation.
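The claim that an upper bound on the risk for estimating f yields an upper bound for classification rests on the classical plug-in comparison inequality: the regret of the rule 1{f̂(x) ≥ 1/2} over the Bayes rule is at most 2·E|f̂(X) − f(X)|. The sketch below illustrates this numerically with a toy conditional probability f and a hypothetical estimate f̂ (both invented here for illustration, not taken from the paper), with X uniform on [0, 1]:

```python
import numpy as np

# Toy setup (assumed, not from the paper): X ~ Uniform[0, 1],
# f(x) = P(Y = 1 | X = x) is the true conditional probability.
xs = np.linspace(0.0, 1.0, 10001)
f = 0.5 + 0.4 * np.sin(2 * np.pi * xs)        # true regression function
fhat = np.clip(f + 0.15 * np.cos(5 * np.pi * xs), 0.0, 1.0)  # hypothetical estimate

bayes = f >= 0.5        # Bayes rule: predict 1 iff f(x) >= 1/2
plugin = fhat >= 0.5    # plug-in rule based on the estimate fhat

# Regret = excess misclassification probability of the plug-in rule:
#   E[ |2 f(X) - 1| * 1{plug-in disagrees with Bayes} ]
regret = np.mean(np.abs(2 * f - 1) * (plugin != bayes))

# Comparison bound: regret <= 2 * E|fhat(X) - f(X)|
l1_bound = 2 * np.mean(np.abs(fhat - f))

print(regret <= l1_bound)  # True: the bound holds
```

The inequality holds pointwise: wherever the two rules disagree, f and f̂ lie on opposite sides of 1/2, so |2f − 1| ≤ 2|f̂ − f| there. The paper's point is that this route can nevertheless give a suboptimal rate for some classes (e.g. monotone functions), even though, under the sufficient condition it identifies, the two rates coincide.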
Similar resources
Metric Entropy and Minimax Risk in Classification
We apply recent results on the minimax risk in density estimation to the related problem of pattern classification. The notion of loss we seek to minimize is an information-theoretic measure of how well we can predict the classification of future examples given the classification of previously seen examples. We give an asymptotic characterization of the minimax risk in terms of the metric entropy p...
Estimating the distribution of jumps in regular affine models: uniform rates of convergence
The problem of separating the jump part of a multidimensional regular affine process from its continuous part is considered. In particular, we present an algorithm for a nonparametric estimation of the jump distribution under the presence of a nonzero diffusion component. An estimation methodology is proposed which is based on the log-affine representation of the conditional characteristic func...
Minimax rates for nonparametric specification testing in regression models
In the context of testing the specification of a nonlinear parametric regression function, we study the power of specification tests using the minimax approach. We determine the maximum rate at which a set of smooth local alternatives can approach the parametric model while ensuring consistency of a test uniformly against any alternative in this set. We show that a smooth nonparametric testing pr...